1 00:00:02,416 --> 00:00:05,616 >> Pat Ryan: The Expedition 36 Crew is getting set 2 00:00:05,616 --> 00:00:09,256 to execute the first space-based test of the system 3 00:00:09,256 --> 00:00:13,486 to permit control of a rover on the ground by a crew member 4 00:00:13,486 --> 00:00:15,566 on board an orbiting spacecraft. 5 00:00:15,846 --> 00:00:18,536 In this case, today it will be Flight Engineer Chris Cassidy 6 00:00:18,796 --> 00:00:21,596 controlling the activity of a rover called K10, 7 00:00:21,596 --> 00:00:24,126 which you see here and which is located 8 00:00:24,126 --> 00:00:27,536 at NASA's Ames Research Center in Moffett Field, California. 9 00:00:28,186 --> 00:00:30,646 Someday, though, it might be an astronaut who's 10 00:00:30,646 --> 00:00:33,716 at the L2 Lagrange Point running a robot on the moon. 11 00:00:33,846 --> 00:00:38,436 The investigation, called Surface Telerobotics, aims to find 12 00:00:38,436 --> 00:00:40,136 out how effectively a person 13 00:00:40,136 --> 00:00:42,666 on orbit can operate a robot on the ground. 14 00:00:43,186 --> 00:00:47,116 Earlier I spoke with the Payload Developer, Maria Bualat, at Ames 15 00:00:47,386 --> 00:00:49,676 about today's operation and the background 16 00:00:49,676 --> 00:00:50,586 of this investigation. 17 00:00:50,706 --> 00:00:52,906 Well, let's start at the beginning. 18 00:00:53,096 --> 00:00:54,926 Where did this idea come from? 19 00:00:54,976 --> 00:00:58,566 Tell me why we think it would be a good idea for an astronaut 20 00:00:58,566 --> 00:01:02,446 in space to be able to control a robot down on a planet 21 00:01:02,446 --> 00:01:03,976 or a moon or an asteroid.
22 00:01:05,016 --> 00:01:06,606 >> Maria Bualat: Well, I'm not sure exactly 23 00:01:06,606 --> 00:01:08,166 where the original idea comes from, 24 00:01:08,166 --> 00:01:10,086 but we've been working a lot 25 00:01:10,086 --> 00:01:13,156 on human exploration architectures, looking at ways 26 00:01:13,156 --> 00:01:17,196 robotics can help and what technologies are needed to enable that, 27 00:01:17,806 --> 00:01:21,386 and one of the hardest parts of any planetary mission is 28 00:01:21,386 --> 00:01:26,296 to safely land on the surface, and a robot 29 00:01:26,296 --> 00:01:29,416 on the surface, controlled by a crew in, say, an orbiting 30 00:01:29,416 --> 00:01:32,406 or approaching vehicle, can get a lot of the sort 31 00:01:32,406 --> 00:01:35,596 of precursor exploration work done. 32 00:01:35,966 --> 00:01:37,496 A robot can be used, for example, 33 00:01:37,496 --> 00:01:40,526 to prepare a landing site, so it could scout 34 00:01:40,526 --> 00:01:43,566 for a clear area, make sure the ground is firm, 35 00:01:43,946 --> 00:01:46,156 or even perhaps build a landing strip. 36 00:01:46,796 --> 00:01:48,396 >> Pat Ryan: But a robot couldn't do all that 37 00:01:48,396 --> 00:01:50,016 on preprogrammed instructions. 38 00:01:50,016 --> 00:01:51,096 It would need guidance. 39 00:01:51,266 --> 00:01:51,736 >> Maria Bualat: Right. 40 00:01:52,046 --> 00:01:52,206 Yes. 41 00:01:52,466 --> 00:01:54,456 >> Pat Ryan: Now, most people are familiar with the idea 42 00:01:54,456 --> 00:01:56,296 of remote control of a machine. 43 00:01:56,296 --> 00:01:59,106 I mean, kids have remote control planes and cars, 44 00:01:59,546 --> 00:02:03,506 but what are the problems that you're facing in this situation? 45 00:02:03,506 --> 00:02:05,576 What makes it hard for an astronaut in space 46 00:02:05,736 --> 00:02:08,376 to control a rover on the ground?
47 00:02:08,946 --> 00:02:11,996 >> Maria Bualat: Well, first off, there's a communications delay 48 00:02:12,046 --> 00:02:15,716 between, say, the station and a rover on the ground. 49 00:02:15,716 --> 00:02:19,326 In the case of Space Station and here on Earth it's a second 50 00:02:19,326 --> 00:02:23,156 or two, and that just makes it very difficult to joystick. 51 00:02:23,156 --> 00:02:26,716 That delay just adds to the amount of concentration 52 00:02:26,716 --> 00:02:28,366 that you need in order to control it. 53 00:02:28,796 --> 00:02:31,166 So we use something called supervisory control. 54 00:02:31,576 --> 00:02:33,066 Our robot's pretty smart. 55 00:02:33,066 --> 00:02:36,296 It can perform tasks, it can keep itself safe, 56 00:02:36,296 --> 00:02:38,456 and then the astronaut can take 57 00:02:38,456 --> 00:02:41,236 over if the rover runs into any trouble. 58 00:02:41,236 --> 00:02:43,816 Say it doesn't quite know how to get around something, 59 00:02:43,816 --> 00:02:46,836 or it isn't collecting the correct data. 60 00:02:46,836 --> 00:02:49,866 So we have the crew member monitoring what the robot's 61 00:02:49,866 --> 00:02:53,056 doing, and that's a little less of a direct control, 62 00:02:53,056 --> 00:02:56,096 and so the delay doesn't really interfere 63 00:02:56,096 --> 00:02:57,246 with that type of control. 64 00:02:57,736 --> 00:02:59,996 Another factor is the space environment. 65 00:03:00,056 --> 00:03:01,786 So, for example, weightlessness, 66 00:03:01,786 --> 00:03:03,476 radiation exposure, 67 00:03:03,476 --> 00:03:06,686 stress factors; these can affect human performance 68 00:03:06,686 --> 00:03:09,356 and make it hard to understand the state of the robot. 69 00:03:09,906 --> 00:03:11,766 So our user interface is designed 70 00:03:11,766 --> 00:03:14,546 to make the rover state as clear as possible. 71 00:03:15,366 --> 00:03:16,746 >> Pat Ryan: The crew member, 72 00:03:16,746 --> 00:03:18,126 what kind of feedback do they have?
73 00:03:18,126 --> 00:03:22,176 Is it just visual, or does the robot talk to them? 74 00:03:22,326 --> 00:03:24,176 >> Maria Bualat: Yes, the robot sends telemetry, 75 00:03:24,176 --> 00:03:27,896 so it sends back information about its position 76 00:03:28,196 --> 00:03:30,576 and about its different sub-systems. 77 00:03:30,576 --> 00:03:33,106 So, for example, what the battery level is, 78 00:03:33,536 --> 00:03:38,256 how the instruments are working, how it's pointed, 79 00:03:38,596 --> 00:03:43,596 and also imagery, so the rover uses stereo cameras in the front 80 00:03:43,746 --> 00:03:47,636 that give it information about obstacles ahead of it. 81 00:03:47,636 --> 00:03:49,146 And so we can use that imagery 82 00:03:49,146 --> 00:03:51,266 to let the crew see what the rover is seeing. 83 00:03:51,686 --> 00:03:54,686 We also generate virtual terrain. 84 00:03:54,756 --> 00:03:55,776 So we show the robot 85 00:03:55,776 --> 00:04:00,236 in a virtual environment that's created from terrain data using 86 00:04:00,236 --> 00:04:03,846 those stereo cameras and also a LiDAR, which uses a laser 87 00:04:04,326 --> 00:04:07,186 to understand the three-dimensional terrain 88 00:04:07,186 --> 00:04:08,096 around the robot. 89 00:04:08,846 --> 00:04:10,436 >> Pat Ryan: Is this interaction 90 00:04:10,436 --> 00:04:11,916 between them pretty well understood? 91 00:04:11,916 --> 00:04:14,826 Are you expecting to learn something in this test 92 00:04:14,826 --> 00:04:16,216 that will let you refine it? 93 00:04:16,796 --> 00:04:19,496 >> Maria Bualat: We want to see how a person in weightlessness 94 00:04:19,496 --> 00:04:21,776 and space reacts to this system. 95 00:04:22,006 --> 00:04:24,446 We've done a lot of work with it on the ground, 96 00:04:24,446 --> 00:04:28,356 but we've never done any kind of testing in space.
97 00:04:28,716 --> 00:04:29,046 >> Pat Ryan: It sounds 98 00:04:29,046 --> 00:04:31,426 like you're testing Chris Cassidy more 99 00:04:31,426 --> 00:04:32,506 than you're testing your rover. 100 00:04:32,636 --> 00:04:33,746 >> Maria Bualat: Not so much, 101 00:04:33,746 --> 00:04:36,426 I mean, we will be asking him questions. 102 00:04:36,426 --> 00:04:41,416 We'll ask something like, is the robot able 103 00:04:41,416 --> 00:04:42,476 to drive forward one meter? 104 00:04:42,476 --> 00:04:43,976 Will it encounter an obstacle? 105 00:04:43,976 --> 00:04:45,056 What's its battery level? 106 00:04:45,056 --> 00:04:46,306 And we're just trying to understand 107 00:04:46,616 --> 00:04:51,486 if he can get good situational awareness from our interface. 108 00:04:51,996 --> 00:04:55,676 So in a way we are testing him, but it's to give us insight 109 00:04:55,676 --> 00:04:58,026 into how well our interfaces work. 110 00:04:58,646 --> 00:05:02,486 >> Pat Ryan: In the case of this test, is there a planned sequence 111 00:05:02,486 --> 00:05:06,266 of events, or is he just going to give it random commands? 112 00:05:06,626 --> 00:05:07,706 >> Maria Bualat: We have a set 113 00:05:07,706 --> 00:05:09,636 of pre-planned sequences for the robot. 114 00:05:09,666 --> 00:05:12,176 So the robot has its mission. 115 00:05:12,176 --> 00:05:16,136 We are going to simulate deploying a radio telescope 116 00:05:16,136 --> 00:05:20,516 on the far side of the moon, and so the idea is 117 00:05:20,516 --> 00:05:25,156 that we'll have had orbital data of the area we're interested in, 118 00:05:25,446 --> 00:05:29,926 and so ground teams will have created plans for the robot, 119 00:05:29,926 --> 00:05:34,836 and the idea is that when Chris sends the command to the robot 120 00:05:34,836 --> 00:05:37,106 and starts it executing, he'll just make sure 121 00:05:37,106 --> 00:05:40,606 that it's not encountering anything that it can't handle.
122 00:05:40,846 --> 00:05:43,036 >> Pat Ryan: How does he control it? 123 00:05:43,036 --> 00:05:46,576 Does he give it voice commands or use joysticks 124 00:05:46,576 --> 00:05:47,506 or something like that? 125 00:05:47,666 --> 00:05:48,576 >> Maria Bualat: No, we're using 126 00:05:48,576 --> 00:05:50,246 a graphical user interface. 127 00:05:50,476 --> 00:05:54,216 So he'll see, as I mentioned, live images from the rover cameras 128 00:05:54,436 --> 00:05:58,466 as well as a couple of 3-D virtual views of the robot. 129 00:05:58,916 --> 00:06:01,876 The robot has several 3-D sensors. 130 00:06:01,916 --> 00:06:04,776 So I mentioned the stereo cameras and the LiDAR, 131 00:06:05,256 --> 00:06:07,306 and then our system uses that information 132 00:06:07,306 --> 00:06:11,236 to create 3-D virtual terrains, and then we have a model 133 00:06:11,236 --> 00:06:15,036 of the robot in that terrain, and that will display to Chris 134 00:06:15,736 --> 00:06:18,696 where the obstacles are, and he can use it 135 00:06:18,696 --> 00:06:20,256 to visualize what the robot is doing. 136 00:06:20,966 --> 00:06:24,226 >> Pat Ryan: And he sends the commands in what way? 137 00:06:24,516 --> 00:06:27,966 >> Maria Bualat: Basically buttons, button presses. 138 00:06:27,966 --> 00:06:31,246 There are some preset commands. 139 00:06:31,396 --> 00:06:35,576 So, for example, drive one meter forward, rotate 15 degrees 140 00:06:35,576 --> 00:06:38,136 to the right, take another panorama, 141 00:06:38,536 --> 00:06:42,336 so fairly simple commands. 142 00:06:42,336 --> 00:06:45,946 >> Pat Ryan: And today's task is quite lengthy, in fact, right? 143 00:06:46,186 --> 00:06:46,726 >> Maria Bualat: Yes. 144 00:06:46,896 --> 00:06:49,026 I believe we have a two 145 00:06:49,026 --> 00:06:51,316 and a half hour block for operations. 146 00:06:51,476 --> 00:06:53,346 Before that we will be doing a little bit of training 147 00:06:53,346 --> 00:06:54,676 on the user interface.
148 00:06:55,536 --> 00:06:57,906 >> Pat Ryan: At the end of the day, what is it 149 00:06:57,966 --> 00:06:58,926 that you hope to learn? 150 00:06:58,926 --> 00:07:01,116 What's going to be the next step in this development? 151 00:07:01,546 --> 00:07:03,446 >> Maria Bualat: Well, we have two more crew sessions 152 00:07:03,446 --> 00:07:07,136 after this through the summer, so roughly one a month. 153 00:07:07,136 --> 00:07:09,056 So we'll continue testing, 154 00:07:09,056 --> 00:07:11,636 continue collecting data on the systems. 155 00:07:12,166 --> 00:07:13,456 We're not just going to look 156 00:07:13,456 --> 00:07:19,646 at how well the crew member can control the robot but also at some 157 00:07:19,646 --> 00:07:24,136 of our comm systems, what sorts of delays we're seeing. 158 00:07:24,216 --> 00:07:26,516 So we're also looking at some other technologies. 159 00:07:26,986 --> 00:07:30,546 And then after that we're going to analyze that data, 160 00:07:30,616 --> 00:07:34,496 see how well our systems work, where we can improve, and also 161 00:07:34,496 --> 00:07:36,656 where the gaps are in current technologies. 162 00:07:36,656 --> 00:07:39,466 So, in other words, what new technologies do we need? 163 00:07:39,466 --> 00:07:42,366 >> Pat Ryan: It sounds like it will be fun 164 00:07:42,366 --> 00:07:43,436 and interesting to watch. 165 00:07:44,506 --> 00:07:45,126 >> Maria Bualat: It should be. 166 00:07:45,126 --> 00:07:48,016 Robots are usually fun to watch, since they're running around, 167 00:07:48,656 --> 00:07:51,416 and a lot of people tend to relate to them. 168 00:07:52,296 --> 00:07:53,526 >> Pat Ryan: Maria, 169 00:07:53,526 --> 00:07:55,876 we really appreciate you taking the time for the update. 170 00:07:55,876 --> 00:07:56,806 Good luck with the tests. 171 00:07:56,916 --> 00:07:57,636 >> Maria Bualat: Thank you very much.
172 00:07:57,896 --> 00:08:00,006 >> Pat Ryan: Maria Bualat is the Payload Developer 173 00:08:00,006 --> 00:08:01,496 and Project Technical Lead 174 00:08:01,706 --> 00:08:03,916 for the Surface Telerobotics Investigation.